Improving Monte Carlo randomized approximation schemes

Author

  • Mark Huber
Abstract

Consider a central problem in randomized approximation schemes that use a Monte Carlo approach. Given a sequence of independent, identically distributed random variables X1, X2, . . . with mean μ and standard deviation at most cμ, where c is a known constant, and ε, δ > 0, create an estimate μ̂ for μ such that P(|μ̂ − μ| > εμ) ≤ δ. This technique has been used to build randomized approximation schemes for the volume of a convex body, the permanent of a nonnegative matrix, the number of linear extensions of a poset, the partition function of the Ising model, and many other problems. Existing methods use (to the leading order) 19.35(c/ε)² ln(1/δ) samples. This is the best possible number up to the constant factor, and it is an open question what the best possible constant is. This work gives an easy-to-apply estimate that uses only 6.96(c/ε)² ln(1/δ) samples in the leading order.
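
The guarantee stated in the abstract can be met by the classic median-of-means construction. The sketch below only illustrates the problem statement; it is not the paper's improved estimator, and the sampler, the group-size constant 4, and the group-count constant 8 are standard textbook choices rather than anything taken from this work.

```python
import math
import random
import statistics

def median_of_means(sampler, eps, delta, c):
    """Return mu_hat with P(|mu_hat - mu| > eps*mu) <= delta, assuming
    each sample has mean mu and standard deviation at most c*mu."""
    # By Chebyshev, a group mean over m >= 4*(c/eps)^2 samples misses
    # the eps*mu window with probability at most 1/4.
    m = math.ceil(4 * (c / eps) ** 2)
    # The median of k = O(log(1/delta)) independent group means fails
    # only if half the groups fail (a Chernoff-type argument).
    k = math.ceil(8 * math.log(1 / delta))
    means = [statistics.fmean(sampler() for _ in range(m)) for _ in range(k)]
    return statistics.median(means)

# Example: exponential samples with mean 2, so sd = mean, i.e. c = 1.
random.seed(1)
est = median_of_means(lambda: random.expovariate(0.5), eps=0.1, delta=0.05, c=1.0)
```

The sample count here scales as (c/ε)² ln(1/δ), matching the leading-order form quoted above, though with a worse constant than either method discussed in the abstract.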


Similar articles

Quasi-Monte Carlo sampling to improve the efficiency of Monte Carlo EM

In this paper we investigate an efficient implementation of the Monte Carlo EM algorithm based on Quasi-Monte Carlo sampling. The Monte Carlo EM algorithm is a stochastic version of the deterministic EM (Expectation-Maximization) algorithm in which an intractable E-step is replaced by a Monte Carlo approximation. Quasi-Monte Carlo methods produce deterministic sequences of points that can signi...
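
The idea behind that paper can be illustrated on a toy integral: replace pseudo-random draws with a deterministic low-discrepancy sequence. The sketch below uses the base-2 van der Corput sequence (a standard QMC construction, not code from the paper) to estimate ∫₀¹ x² dx = 1/3.

```python
import random

def van_der_corput(n, base=2):
    """n-th term of the base-b van der Corput low-discrepancy sequence."""
    q, bk = 0.0, 1.0 / base
    while n > 0:
        n, r = divmod(n, base)  # peel off the digits of n in the given base
        q += r * bk             # and mirror them across the radix point
        bk /= base
    return q

f = lambda x: x * x  # toy integrand; the exact integral over [0, 1] is 1/3
N = 4096

random.seed(0)
mc = sum(f(random.random()) for _ in range(N)) / N            # plain Monte Carlo
qmc = sum(f(van_der_corput(i)) for i in range(1, N + 1)) / N  # quasi-Monte Carlo
```

For smooth integrands the QMC error decays like O(log N / N) rather than the O(1/√N) of plain Monte Carlo, which is the efficiency gain the paper exploits inside the E-step.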


Quasi-Monte Carlo Feature Maps for Shift-Invariant Kernels (arXiv:1412.8293v2 [stat.ML], 9 Aug 2015)

We consider the problem of improving the efficiency of randomized Fourier feature maps to accelerate training and testing speed of kernel methods on large datasets. These approximate feature maps arise as Monte Carlo approximations to integral representations of shift-invariant kernel functions (e.g., Gaussian kernel). In this paper, we propose to use Quasi-Monte Carlo (QMC) approximations inst...
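
The Monte Carlo feature maps that paper starts from can be sketched with plain random Fourier features for the Gaussian kernel exp(-‖x − y‖²/2): the inner product of the feature vectors is an unbiased estimate of the kernel. This is the baseline construction (Rahimi–Recht style), not the QMC variant the paper proposes; the parameter names are illustrative.

```python
import math
import random

def gaussian_rff(dim, D, seed=0):
    """Random Fourier feature map z with E[z(x)Â·z(y)] = exp(-||x-y||^2/2):
    rows of W ~ N(0, I), phases b ~ Uniform[0, 2*pi]."""
    rng = random.Random(seed)
    W = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(D)]
    b = [rng.uniform(0, 2 * math.pi) for _ in range(D)]
    def features(x):
        return [math.sqrt(2.0 / D) * math.cos(sum(wi * xi for wi, xi in zip(w, x)) + bj)
                for w, bj in zip(W, b)]
    return features

phi = gaussian_rff(dim=2, D=2000)
x, y = [0.3, -0.1], [0.5, 0.4]
exact = math.exp(-sum((a - c) ** 2 for a, c in zip(x, y)) / 2)
approx = sum(p * q for p, q in zip(phi(x), phi(y)))
```

The Monte Carlo error of `approx` decays like O(1/√D); replacing the random rows of W with a low-discrepancy point set is exactly the QMC substitution the paper studies.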


A Deterministic Polynomial-Time Approximation Scheme for Counting Knapsack Solutions

Given n elements with nonnegative integer weights w1, . . . , wn and an integer capacity C, we consider the counting version of the classic knapsack problem: find the number of distinct subsets whose weights add up to at most the given capacity. We give a deterministic algorithm that estimates the number of solutions to within relative error 1±ε in time polynomial in n and 1/ε (fully polynomial...
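
For contrast with that deterministic approximation scheme, the counting problem itself has a textbook exact pseudo-polynomial dynamic program, sketched below; its O(nC) running time depends on the capacity C, which is precisely the dependence the paper's polynomial-in-(n, 1/ε) scheme avoids.

```python
def count_knapsack(weights, C):
    """Exact count of subsets whose total weight is at most C,
    via the standard O(n*C) counting DP."""
    # dp[w] = number of subsets of the items seen so far with total weight w
    dp = [0] * (C + 1)
    dp[0] = 1  # the empty subset
    for wi in weights:
        for w in range(C, wi - 1, -1):  # descend so each item is used once
            dp[w] += dp[w - wi]         # subsets that include this item
    return sum(dp)                      # every weight from 0 up to C
```

For example, with weights [1, 2, 3] and capacity 3, the qualifying subsets are {}, {1}, {2}, {3}, and {1, 2}, so the count is 5.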


The Bias-Variance Dilemma of the Monte Carlo Method

We investigate the setting in which Monte Carlo methods are used and draw a parallel to the formal setting of statistical inference. In particular, we find that Monte Carlo approximation gives rise to a bias-variance dilemma. We show that it is possible to construct biased approximation schemes with a lower approximation error than related unbiased algorithms.
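
The dilemma described there can be seen in a minimal simulation: shrinking the sample mean toward zero introduces bias but reduces variance, and for a small true mean the biased estimator has lower mean squared error. The setup below (true mean 0.2, shrinkage factor 0.5) is a hypothetical illustration, not an experiment from the paper.

```python
import random

random.seed(0)
theta, n, trials = 0.2, 5, 20000  # true mean, samples per trial, repetitions
a = 0.5                           # shrinkage factor (hypothetical choice)
se_mean = se_shrunk = 0.0
for _ in range(trials):
    xbar = sum(random.gauss(theta, 1) for _ in range(n)) / n
    se_mean += (xbar - theta) ** 2        # unbiased sample mean
    se_shrunk += (a * xbar - theta) ** 2  # biased shrinkage estimator
mse_mean, mse_shrunk = se_mean / trials, se_shrunk / trials
```

Analytically, MSE(x̄) = 1/n = 0.2 here, while MSE(a·x̄) = a²/n + (1 − a)²θ² = 0.06, so the biased scheme wins despite its systematic error.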


Hierarchical fractional-step approximations and parallel kinetic Monte Carlo algorithms

We present a mathematical framework for constructing and analyzing parallel algorithms for lattice kinetic Monte Carlo (KMC) simulations. The resulting algorithms have the capacity to simulate a wide range of spatio-temporal scales in spatially distributed, non-equilibrium physicochemical processes with complex chemistry and transport micro-mechanisms. Rather than focusing on constructing exactl...




Journal:
  • CoRR

Volume: abs/1411.4074

Pages: -

Publication year: 2014